
    Coexistence of OFDM and FBMC for Underlay D2D Communication in 5G Networks

    Device-to-device (D2D) communication is being heralded as an important part of the solution to the capacity problem in future networks, and is expected to be natively supported in 5G. Given the high network complexity and required signalling overhead associated with achieving synchronization in D2D networks, it is necessary to study asynchronous D2D communications. In this paper, we consider a scenario whereby asynchronous D2D communication underlays an OFDMA macro-cell in the uplink. Motivated by the superior performance of new waveforms with increased spectral localization in the presence of frequency and time misalignments, we compare the system-level performance of a set-up for when D2D pairs use either OFDM or FBMC/OQAM. We first demonstrate that inter-D2D interference, resulting from misaligned communications, plays a significant role in clustered D2D topologies. We then demonstrate that the resource allocation procedure can be simplified when D2D pairs use FBMC/OQAM, since the high spectral localization of FBMC/OQAM results in negligible inter-D2D interference. Specifically, we identify that FBMC/OQAM is best suited to scenarios consisting of small, densely populated D2D clusters located near the encompassing cell's edge. (7 pages, 9 figures; accepted at IEEE Globecom 2016 Workshop.)

    Financial modelling with 2-EPT probability density functions

    The class of all Exponential-Polynomial-Trigonometric (EPT) functions is classical and equal to the Euler-d'Alembert class of solutions of linear differential equations with constant coefficients. The class of non-negative EPT functions defined on [0,∞) was discussed in Hanzon and Holland (2010), of which EPT probability density functions are an important subclass. EPT functions can be represented as c e^{Ax} b, where A is a square matrix, b a column vector and c a row vector, and the triple (A, b, c) is the minimal realization of the EPT function. The minimal triple is only unique up to a basis transformation. Here the class of 2-EPT probability density functions on R is defined and shown to be closed under a variety of operations. The class is also generalised to include mixtures with a point mass at zero. This class coincides with the class of probability density functions with rational characteristic functions. It is illustrated that the Variance Gamma density is a 2-EPT density under a parameter restriction. A discrete 2-EPT process is a process which has stochastically independent 2-EPT random variables as increments. It is shown that the distribution of the minimum and maximum of such a process is an EPT density mixed with a point mass at zero. The Laplace transforms of these distributions correspond to the discrete time Wiener-Hopf factors of the discrete time 2-EPT process. A distribution of daily log-returns, observed over the period 1931-2011 from a prominent US index, is approximated with a 2-EPT density function. Without the non-negativity condition, it is illustrated how this problem is transformed into a discrete time rational approximation problem. The rational approximation software RARL2 is used to carry out this approximation. The non-negativity constraint is then imposed via a convex optimisation procedure after the unconstrained approximation.
Necessary and sufficient conditions are derived to characterise infinitely divisible EPT and 2-EPT functions. Infinitely divisible 2-EPT density functions generate 2-EPT Lévy processes. An asset's log returns can be modelled as a 2-EPT Lévy process. Closed form pricing formulae are then derived for European Options with specific times to maturity. Formulae for discretely monitored Lookback Options and 2-Period Bermudan Options are also provided. Certain Greeks, including Delta and Gamma, of these options are also computed analytically. MATLAB scripts are provided for calculations involving 2-EPT functions. Numerical option pricing examples illustrate the effectiveness of the 2-EPT approach to financial modelling.
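The representation c e^{Ax} b can be evaluated numerically with a matrix exponential. A minimal Python sketch, assuming SciPy is available; the triple (A, b, c) below is an illustrative choice (the exponential density, the simplest EPT density), not one taken from the thesis:

```python
import numpy as np
from scipy.linalg import expm

def ept_density(x, A, b, c):
    """Evaluate the EPT function f(x) = c · exp(A x) · b at a scalar x >= 0."""
    return float(c @ expm(A * x) @ b)

# Illustrative minimal realization: the exponential density with rate lam
# has A = [[-lam]], b = [1], c = [lam], so f(x) = lam * exp(-lam * x).
lam = 2.0
A = np.array([[-lam]])
b = np.array([1.0])
c = np.array([lam])

xs = np.linspace(0.0, 10.0, 2001)
vals = np.array([ept_density(x, A, b, c) for x in xs])
# Trapezoid rule: the total mass of a density should be close to 1.
mass = np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(xs))
print(ept_density(0.0, A, b, c), mass)
```

Larger matrices A yield polynomial and trigonometric factors in f via repeated and complex eigenvalues, which is where the "EPT" name comes from.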

    On Matching Users to Specialised MNOs in Service Tailored Networks of the Future

    In this paper, we investigate a network model in which entities called subscription brokers group service level agreements with multiple specialised mobile network operators (SMNOs) into a single subscription bundle, with a fixed data allowance that can be used by the subscriber as needed across any of the SMNOs included in the bundle. Each SMNO operates a network that is designed to meet the demands of a particular service area or vertical industry. We demonstrate the performance benefits of such a model, allowing users to choose SMNOs according to the needs of the service that they are using. In particular, we focus on how to perform the matching between users and SMNOs in a bundle, adopting the Gale-Shapley matching algorithm. We argue that a stable matching is needed to ensure that both SMNOs and users are incentivised to adopt the broker-based model. We outline a framework based on the concept of utility for devising the preference lists of users, while the approach we propose for building the preference lists of SMNOs can differentiate between different classes of users based on the price they pay for their subscription. We evaluate the performance cost, in terms of utility, of achieving stability compared to a sum utility maximisation matching approach, showing that this cost is largely borne by the lower priority users. Overall, the proposed broker-based model performs at least as well as any one SMNO for lower priority users, and outperforms any one SMNO for higher priority users.
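The matching step can be illustrated with the classical deferred-acceptance form of the Gale-Shapley algorithm. A minimal one-to-one sketch with hypothetical users and SMNOs and single-user capacity per SMNO; the paper's actual preference lists are utility- and price-based, whereas the lists below are arbitrary illustrative data:

```python
def gale_shapley(user_prefs, smno_prefs):
    """Deferred acceptance: users propose in preference order; each SMNO
    tentatively holds its most-preferred proposer so far. Returns a stable
    matching as a dict {smno: user}."""
    # Precompute each SMNO's ranking of users for O(1) comparisons.
    rank = {s: {u: i for i, u in enumerate(prefs)} for s, prefs in smno_prefs.items()}
    free = list(user_prefs)            # users not yet matched
    next_choice = {u: 0 for u in user_prefs}
    match = {}                         # smno -> user
    while free:
        u = free.pop()
        s = user_prefs[u][next_choice[u]]
        next_choice[u] += 1
        if s not in match:
            match[s] = u               # SMNO was unmatched: accept
        elif rank[s][u] < rank[s][match[s]]:
            free.append(match[s])      # SMNO trades up; old user is freed
            match[s] = u
        else:
            free.append(u)             # proposal rejected
    return match

# Hypothetical instance: both users prefer SMNO "A", but "A" prefers u2.
user_prefs = {"u1": ["A", "B"], "u2": ["A", "B"]}
smno_prefs = {"A": ["u2", "u1"], "B": ["u1", "u2"]}
matching = gale_shapley(user_prefs, smno_prefs)
print(matching)   # {'A': 'u2', 'B': 'u1'}
```

Stability here means no user-SMNO pair would both prefer each other over their assigned partners, which is the incentive property the paper argues is needed for broker adoption.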

    Rational Approximation of Transfer Functions for Non-Negative EPT Densities

    An Exponential-Polynomial-Trigonometric (EPT) function is defined on [0,∞) by a minimal realization (A, b, c). A stable non-negative EPT function of a fixed degree is fitted to the histogram of a large set of data using an L2 criterion. If we neglect the non-negativity constraint, this is shown to be equivalent to a rational approximation problem which is approached using the RARL2 software. We show how, under the additional assumption of the existence of a strictly dominant real pole of the rational function, the non-negativity constraint on the EPT function can be imposed by performing a constrained convex optimization on b at each stage at which an (A, c) pair is determined. In this convex optimization step, a recent generalized Budan-Fourier sequence approach to determining non-negativity of an EPT function on a finite interval plays a major role.
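The convexity of the step over b can be sketched directly: for a fixed (A, c) pair, the map b ↦ f(·; A, b, c) is linear, so the L2 histogram fit is a convex quadratic in b, and requiring f ≥ 0 at sample points gives linear inequality constraints. The sketch below uses SciPy's SLSQP on synthetic data as a crude stand-in for the generalized Budan-Fourier certificate and RARL2 machinery used in the paper; the (A, c) pair and all values are illustrative assumptions:

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

# Hypothetical fixed (A, c) pair from an outer-loop iteration.
A = np.array([[-1.0, 1.0], [0.0, -2.0]])
c = np.array([1.0, 0.5])
xs = np.linspace(0.0, 5.0, 50)
# f(x_j; A, b, c) = G[j] @ b is linear in b.
G = np.array([c @ expm(A * x) for x in xs])

# Synthetic "histogram" data generated from a known b, plus small noise.
b_true = np.array([1.0, 2.0])
y = G @ b_true + 0.01 * np.random.default_rng(1).normal(size=len(xs))

# Convex quadratic objective (L2 criterion) with linear constraints G b >= 0,
# a pointwise surrogate for non-negativity of the EPT function.
obj = lambda b: np.sum((G @ b - y) ** 2)
res = minimize(obj, x0=np.zeros(2), method="SLSQP",
               constraints=[{"type": "ineq", "fun": lambda b: G @ b}])
print(res.x)   # close to b_true
```

Pointwise grid constraints only certify non-negativity on the grid; the Budan-Fourier sequence approach in the paper certifies it on a whole interval.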

    A dynamical model reveals gene co-localizations in nucleus

    Co-localization of networks of genes in the nucleus is thought to play an important role in determining gene expression patterns. Based upon experimental data, we built a dynamical model to test whether pure diffusion could account for the observed co-localization of genes within a defined subnuclear region. A simple standard Brownian motion model in two and three dimensions shows that preferential co-localization is possible for co-regulated genes without any direct interaction, and suggests that the occurrence may be due to a limitation in the number of available transcription factors. Experimental data on chromatin movements demonstrate that fractional rather than standard Brownian motion is more appropriate for modelling gene mobility, and we tested our dynamical model against recent static experimental data using a sub-diffusion process by which the genes tend to co-localize more easily. Moreover, in order to compare our model with recently obtained experimental data, we studied the association level between genes and factors, and present data supporting the validation of this dynamic model. As further applications, we tested the model against additional biological observations. We found that an increasing number of transcription factors, rather than factory number or nucleus size, might be the reason for decreasing gene co-localization. In the scenario of frequency- or amplitude-modulation of transcription factors, our model predicts that frequency modulation may increase the co-localization between its targeted genes.
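The baseline diffusion model is easy to simulate. A minimal standard-Brownian-motion sketch checking that the mean squared displacement grows linearly in time (MSD(t) ≈ d·t in d dimensions with unit diffusivity); a sub-diffusive fractional Brownian motion with Hurst exponent H < 1/2, as the chromatin data favour, would instead scale like t^{2H}. All parameters are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def brownian_paths(n_paths, n_steps, dt=0.01, dim=2):
    """Simulate standard Brownian motion: i.i.d. Gaussian increments with
    variance dt per step, cumulatively summed along the time axis."""
    steps = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps, dim))
    return np.cumsum(steps, axis=1)

paths = brownian_paths(n_paths=500, n_steps=1000)
t = np.arange(1, 1001) * 0.01
msd = np.mean(np.sum(paths**2, axis=2), axis=0)   # mean squared displacement

# For standard BM in d dimensions, MSD(t) = d * t, so MSD/t ~ 2 here;
# fractional BM with H < 1/2 would bend this curve downward (sub-diffusion).
slope = msd[-1] / t[-1]
print(slope)
```

In a fractional-Brownian variant the increments are correlated rather than independent, which is why the cumulative-sum construction above no longer applies and dedicated generators (e.g. Cholesky or circulant embedding of the fBm covariance) are used instead.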

    HEP Community White Paper on Software trigger and event reconstruction

    Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document. (Editors: Vladimir Vava Gligorov and David Lang.)

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade. (Peer reviewed.)

    Differential cross section measurements for the production of a W boson in association with jets in proton–proton collisions at √s = 7 TeV

    Measurements are reported of differential cross sections for the production of a W boson, which decays into a muon and a neutrino, in association with jets, as a function of several variables, including the transverse momenta (pT) and pseudorapidities of the four leading jets, the scalar sum of jet transverse momenta (HT), and the difference in azimuthal angle between the directions of each jet and the muon. The data sample of pp collisions at a centre-of-mass energy of 7 TeV was collected with the CMS detector at the LHC and corresponds to an integrated luminosity of 5.0 fb⁻¹. The measured cross sections are compared to predictions from Monte Carlo generators, MadGraph + pythia and sherpa, and to next-to-leading-order calculations from BlackHat + sherpa. The differential cross sections are found to be in agreement with the predictions, apart from the pT distributions of the leading jets at high pT values, the distributions of HT at high HT and low jet multiplicity, and the distribution of the difference in azimuthal angle between the leading jet and the muon at low values. (Funding: United States Department of Energy; National Science Foundation (U.S.); Alfred P. Sloan Foundation.)

    Risk Portfolio Optimisation Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, given the human inability to predict the future precisely, as stated in Al-Qur'an surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective here is to minimise the variance among all portfolios that achieve at least a certain expected return, or alternatively, to maximise the expected return among all portfolios whose variance does not exceed a given level. Furthermore, this study focuses on optimising the risk portfolio via the Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum variance portfolio produces a convex quadratic programming problem: minimising the objective function xᵀQx subject to the constraints rᵀx ≥ r and Ax = b. The outcome of this research is the solution of the optimal risk portfolio for a set of investments, computed using MATLAB R2007b software together with graphical analysis.
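When only the budget constraint 1ᵀw = 1 is imposed, the minimum-variance problem has a closed-form Lagrangian solution w = Σ⁻¹1 / (1ᵀΣ⁻¹1), which makes a useful sanity check for any quadratic-programming solver. A minimal NumPy sketch with a hypothetical 3-asset covariance matrix (the paper itself works in MATLAB R2007b; the numbers below are illustrative):

```python
import numpy as np

# Hypothetical 3-asset covariance matrix: symmetric positive definite.
Sigma = np.array([[0.10, 0.02, 0.04],
                  [0.02, 0.08, 0.01],
                  [0.04, 0.01, 0.12]])
ones = np.ones(3)

# Global minimum-variance portfolio under the budget constraint 1ᵀw = 1:
# w = Σ⁻¹1 / (1ᵀ Σ⁻¹ 1). Solve the linear system rather than inverting Σ.
inv1 = np.linalg.solve(Sigma, ones)
w = inv1 / (ones @ inv1)
variance = w @ Sigma @ w
print(w, variance)
```

At the optimum the stationarity condition Σw = λ1 holds with λ equal to the portfolio variance; adding a minimum-expected-return constraint rᵀw ≥ r turns this into the general quadratic program from the abstract, which no longer has this simple closed form.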

    Juxtaposing BTE and ATE – on the role of the European insurance industry in funding civil litigation

    One of the ways in which legal services are financed, and indeed shaped, is through private insurance arrangements. Two contrasting types of legal expenses insurance (LEI) contracts seem to dominate in Europe: before the event (BTE) and after the event (ATE) legal expenses insurance. Notwithstanding institutional differences between legal systems, BTE and ATE insurance arrangements may be instrumental if government policy is geared towards strengthening a market-oriented system of financing access to justice for individuals and business. At the same time, emphasizing the role of a private industry as a keeper of the gates to justice raises issues of accountability and transparency, not readily reconcilable with demands of competition. Moreover, multiple actors (clients, lawyers, courts, insurers) are involved, causing behavioural dynamics which are not easily predicted or influenced. Against this background, this paper looks into BTE and ATE arrangements by analysing the particularities of the BTE and ATE arrangements currently available in some European jurisdictions and by painting a picture of their respective markets and legal contexts. This allows for some reflection on the performance of BTE and ATE providers as both financiers and keepers. Two issues emerge from the analysis that are worthy of further reflection. Firstly, there is the problematic long-term sustainability of some ATE products. Secondly, there are the challenges faced by policymakers who would like to nudge consumers into voluntarily taking out BTE LEI.